
Creators/Authors contains: "Karanikolas, Georgios V."


  1. Partial correlations (PCs) and the related inverse covariance matrix adopted by graphical lasso are widely applicable tools for learning graph connectivity given nodal observations. The resultant estimators, however, can be sensitive to outliers. Robust approaches developed so far to cope with outliers do not (explicitly) account for nonlinear interactions possibly present among nodal processes, which can hurt the identification of graph connectivity merely due to model mismatch. To overcome this limitation, a novel formulation of robust PCs is introduced based on nonlinear kernel functions. The proposed scheme leverages robust ridge regression techniques, spectral Fourier-feature-based kernel approximants, and robust association measures. Numerical tests on synthetic and real data illustrate the potential of the novel approach. (A minimal code sketch of these ingredients follows after this list.)
  2. Higher-order link prediction (HOLP) seeks missing links that capture dependencies among three or more network nodes. Predicting higher-order links (HOLs) can, for instance, reveal hyperlinks in the structure of drug substance and metabolic networks. Existing methods either make restrictive assumptions regarding the emergence of HOLs, or rely on reduced-dimensionality models of limited expressiveness. To overcome these limitations, the HOLP approach developed here leverages distribution similarities across embeddings, as captured by a learnable probability metric. The intuition underpinning the novel approach is that sets of nodes whose embeddings are less similar in distribution are less likely to be connected by a HOL. Specifically, nonlinear dimensionality reduction is effected through a Gaussian process latent variable model that yields nodal embeddings and also learns a data-driven similarity function (kernel). This kernel forms the core of a maximum mean discrepancy probability metric. Tests on benchmark datasets illustrate the potential of the proposed approach. (A minimal code sketch follows after this list.)
    Free, publicly-accessible full text available June 4, 2024
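To make the ingredients listed in item 1 concrete, the following is a minimal sketch, not the authors' code: it assumes a Gaussian kernel approximated with random Fourier features, ordinary ridge regression in the feature space, and Spearman rank correlation as a simple stand-in for a robust association measure. The paper's specific robust ridge estimator is not reproduced, and all function names and parameters here are illustrative.

    # Kernel-based partial correlation of nodes i and j given the remaining nodes:
    # regress both signals on Fourier-feature maps of the conditioning nodes and
    # associate the residuals with a rank (robust) correlation.
    import numpy as np
    from scipy.stats import spearmanr

    def random_fourier_features(x, num_features=50, bandwidth=1.0, seed=0):
        """Random Fourier feature map approximating a Gaussian kernel on a 1-D signal."""
        rng = np.random.default_rng(seed)
        w = rng.normal(scale=1.0 / bandwidth, size=num_features)  # spectral frequencies
        b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)      # random phases
        return np.sqrt(2.0 / num_features) * np.cos(np.outer(x, w) + b)

    def ridge_residual(y, Z, lam=1e-2):
        """Residual of ridge regression of y on the feature matrix Z."""
        coef = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
        return y - Z @ coef

    def kernel_partial_correlation(X, i, j, lam=1e-2):
        """Nonlinear partial correlation between columns i and j of X, given the rest."""
        others = [k for k in range(X.shape[1]) if k not in (i, j)]
        Z = np.hstack([random_fourier_features(X[:, k], seed=k) for k in others])
        r_i = ridge_residual(X[:, i], Z, lam)
        r_j = ridge_residual(X[:, j], Z, lam)
        rho, _ = spearmanr(r_i, r_j)  # rank correlation limits the influence of outliers
        return rho

    # Example: 200 observations over 5 nodal processes
    X = np.random.default_rng(1).normal(size=(200, 5))
    print(kernel_partial_correlation(X, 0, 1))

Repeating the last call over all node pairs and thresholding the scores would yield a graph-connectivity estimate in the spirit of item 1.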
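For item 2, the sketch below scores a candidate higher-order link by the maximum mean discrepancy (MMD) between the embedding distributions of its member nodes. A fixed Gaussian kernel stands in for the data-driven kernel that the paper learns through a Gaussian process latent variable model, and each node is assumed to come with several embedding samples (e.g., posterior draws); these names and choices are illustrative, not the authors' implementation.

    # Score a candidate higher-order link (HOL): node sets whose embedding
    # distributions are closer in MMD receive a higher score.
    import numpy as np

    def rbf_kernel(A, B, bandwidth=1.0):
        """Gaussian kernel matrix between the rows of A and the rows of B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))

    def mmd2(A, B, bandwidth=1.0):
        """Squared (biased) MMD between two sets of embedding samples."""
        return (rbf_kernel(A, A, bandwidth).mean()
                + rbf_kernel(B, B, bandwidth).mean()
                - 2.0 * rbf_kernel(A, B, bandwidth).mean())

    def hol_score(embeddings, node_set, bandwidth=1.0):
        """Average pairwise MMD within the candidate set, negated so that a
        larger score suggests a more likely HOL."""
        nodes = list(node_set)
        pair_mmds = [mmd2(embeddings[u], embeddings[v], bandwidth)
                     for a, u in enumerate(nodes) for v in nodes[a + 1:]]
        return -np.mean(pair_mmds)

    # Example: 3 candidate nodes, each with 20 samples of a 4-D embedding
    rng = np.random.default_rng(0)
    emb = {n: rng.normal(size=(20, 4)) for n in range(3)}
    print(hol_score(emb, {0, 1, 2}))

Ranking candidate node sets by this score mirrors the intuition in item 2 that sets whose embeddings are less similar in distribution are less likely to form a HOL.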